Forget and Rewire: Enhancing the Resilience of Transformer-based Models against Bit-Flip Attacks

Authors: 

Najmeh Nazari, Hosein Mohammadi Makrani, and Chongzhou Fang, University of California, Davis; Hossein Sayadi, California State University, Long Beach; Setareh Rafatirad, University of California, Davis; Khaled N. Khasawneh, George Mason University; Houman Homayoun, University of California, Davis

Abstract: 

Bit-Flip Attacks (BFAs) involve adversaries flipping bits in a model's parameters to significantly degrade its accuracy. They typically target the most vulnerable parameters, causing maximal damage with a minimal number of bit-flips. While the impact of BFAs on Deep Neural Networks (DNNs) is well studied, their effects on Large Language Models (LLMs) and Vision Transformers (ViTs) have not received the same attention. Inspired by "brain rewiring," we explore enhancing Transformers' resilience against such attacks. This potential stems from the unique architecture of Transformer-based models, particularly their Linear layers. Our novel approach, called Forget and Rewire (FaR), strategically applies rewiring to Linear layers to obfuscate neuron connections. By redistributing tasks from critical to non-essential neurons, we reduce the model's sensitivity to specific parameters while preserving its core functionality. This strategy thwarts adversaries' attempts to identify and target crucial parameters using gradient-based algorithms. Our approach conceals pivotal parameters and also improves robustness against random attacks. Comprehensive evaluations across widely used datasets and Transformer frameworks show that the FaR mechanism significantly reduces BFA success rates, by 1.4 to 4.2 times, with minimal accuracy loss (less than 2%).
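To make the rewiring intuition concrete, the sketch below (in PyTorch) is not the authors' FaR implementation; it only illustrates the underlying property the abstract relies on: the hidden neurons of a Linear layer can be permuted ("rewired") without changing the block's function, provided the next layer's input weights are permuted to match. All names (fc1, fc2, perm) are illustrative assumptions; FaR itself additionally redistributes work from critical to non-essential neurons, which this toy example does not attempt.

# Minimal sketch (assumed, not the paper's code): permuting the output neurons
# of one Linear layer and the matching input columns of the next layer leaves
# the two-layer block functionally unchanged, since the element-wise activation
# commutes with the permutation. The parameters' memory layout changes, so an
# offline gradient-based ranking of "most vulnerable" bits no longer maps onto
# the deployed weights.
import torch
import torch.nn as nn

torch.manual_seed(0)

fc1 = nn.Linear(16, 32)   # hidden layer with 32 neurons
fc2 = nn.Linear(32, 8)
act = nn.ReLU()

x = torch.randn(4, 16)
y_ref = fc2(act(fc1(x)))            # output of the original block

perm = torch.randperm(32)           # random rewiring of the 32 hidden neurons

with torch.no_grad():
    fc1.weight.copy_(fc1.weight[perm])     # permute rows (output neurons) of fc1
    fc1.bias.copy_(fc1.bias[perm])
    fc2.weight.copy_(fc2.weight[:, perm])  # permute matching input columns of fc2

y_rewired = fc2(act(fc1(x)))
print(torch.allclose(y_ref, y_rewired, atol=1e-6))  # True: function preserved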

Open Access Media

USENIX is committed to Open Access to the research presented at our events. Papers and proceedings are freely available to everyone once the event begins. Any video, audio, and/or slides that are posted after the event are also free and open to everyone. Support USENIX and our commitment to Open Access.

BibTeX
@inproceedings {299862,
author = {Najmeh Nazari and Hosein Mohammadi Makrani and Chongzhou Fang and Hossein Sayadi and Setareh Rafatirad and Khaled N. Khasawneh and Houman Homayoun},
title = {Forget and Rewire: Enhancing the Resilience of Transformer-based Models against {Bit-Flip} Attacks},
booktitle = {33rd USENIX Security Symposium (USENIX Security 24)},
year = {2024},
isbn = {978-1-939133-44-1},
address = {Philadelphia, PA},
pages = {1349--1366},
url = {https://www.usenix.org/conference/usenixsecurity24/presentation/nazari},
publisher = {USENIX Association},
month = aug
}
